Jaewoo Kang

ATTNSOM: Learning Cross-Isoform Attention for Cytochrome P450 Site-of-Metabolism

Jan 28, 2026

Benchmarking Direct Preference Optimization for Medical Large Vision-Language Models

Jan 25, 2026

SCRIPTMIND: Crime Script Inference and Cognitive Evaluation for LLM-based Social Engineering Scam Detection System

Jan 20, 2026

HiRef: Leveraging Hierarchical Ontology and Network Refinement for Robust Medication Recommendation

Aug 14, 2025

Outlier-Safe Pre-Training for Robust 4-Bit Quantization of Large Language Models

Jun 24, 2025

Med-PRM: Medical Reasoning Models with Stepwise, Guideline-verified Process Rewards

Jun 13, 2025

Does Time Have Its Place? Temporal Heads: Where Language Models Recall Time-specific Information

Feb 20, 2025

GPO-VAE: Modeling Explainable Gene Perturbation Responses utilizing GRN-Aligned Parameter Optimization

Jan 31, 2025

KU AIGEN ICL EDI@BC8 Track 3: Advancing Phenotype Named Entity Recognition and Normalization for Dysmorphology Physical Examination Reports

Jan 16, 2025

Monet: Mixture of Monosemantic Experts for Transformers

Dec 05, 2024